Modeling of Item-Difficulty for Ontology-based MCQs

Authors

  • Vinu E. V
  • Tahani Alsubait
  • P. Sreenivasa Kumar
Abstract

Multiple choice questions (MCQs) that can be generated from a domain ontology can significantly reduce the human effort and time required for authoring and administering assessments in an e-Learning environment. Even though there are various methods for generating MCQs from ontologies, methods for determining the difficulty-levels of such MCQs are less explored. In this paper, we study the various aspects and factors that are involved in determining the difficulty-score of an MCQ, and propose an ontology-based model for predicting it. This model characterizes the difficulty values associated with the stem and the choice set of an MCQ, and describes a measure which combines the two scores. Furthermore, the notion of assigning difficulty-scores based on the skill level of the test taker is utilized for predicting the difficulty-score of a stem. We studied the effectiveness of the predicted difficulty-scores with the help of a psychometric model from Item Response Theory, involving real students and domain experts. Our results show that the predicted difficulty-levels of the MCQs correlate strongly with their actual difficulty-levels.
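
The abstract describes the model only at a high level, so the following Python sketch is purely illustrative: the function names, inputs, and the convex combination below are assumptions, not the authors' actual formulas. It shows one plausible way to combine a skill-level-based stem score with a choice-set score into a single difficulty value.

    # Illustrative sketch only: the abstract does not give the paper's
    # formulas, so the functions, inputs, and weights here are assumptions.

    def stem_difficulty(answering_skill_levels, max_skill_level):
        # Assumption: a stem that only high-skill test takers can answer
        # is harder, so score it by the lowest skill level that suffices.
        return min(answering_skill_levels) / max_skill_level

    def choice_set_difficulty(distractor_key_similarities):
        # Assumption: distractors semantically close to the key are harder
        # to eliminate, so average their similarity to the key.
        return sum(distractor_key_similarities) / len(distractor_key_similarities)

    def mcq_difficulty(stem_score, choice_score, alpha=0.5):
        # Assumption: the combined measure is a convex combination of the
        # stem and choice-set scores.
        return alpha * stem_score + (1 - alpha) * choice_score

    # Example: a stem answerable from skill level 3 (of 5) upwards, with
    # three distractors of similarity 0.8, 0.6 and 0.4 to the key.
    score = mcq_difficulty(stem_difficulty([3, 4, 5], 5),
                           choice_set_difficulty([0.8, 0.6, 0.4]))
    print(f"predicted difficulty: {score:.2f}")  # 0.60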

Related articles

Reviewing the results of qualitative and quantitative analysis of MCQs in Introduction to clinical medicine course

Background: At most medical universities, MCQ-based examinations are often used as the first component, particularly to ensure that candidates have an adequate knowledge base prior to entering subsequent clinical examinations. Method: In ICM, 3,973 MCQs were collected from tests over a four-year period from 2005 to 2009. Questions were evaluated for 10 frequently occurring item writing f...

Automated Generation of Assessment Tests from Domain Ontologies

The objective of this paper is to investigate the scope of OWL-DL ontologies in generating multiple choice questions (MCQs) that can be employed for conducting large-scale assessments, and to conduct a detailed study on the effectiveness of the generated assessment items, using principles from Item Response Theory (IRT). The details of a prototype system called Automatic Test Generation (ATG)...
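
For context, the IRT analysis this paper refers to typically rests on an item characteristic curve such as the two-parameter logistic (2PL) model. The snippet below is the standard textbook formulation of that curve, not code from the ATG system.

    import math

    def icc_2pl(theta, a, b):
        # Standard 2PL item characteristic curve: probability that an
        # examinee of ability theta answers correctly an item with
        # difficulty b and discrimination a.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    # An average examinee (theta = 0) succeeds more often on an easy
    # item (b = -1.0) than on a hard one (b = +1.0).
    print(icc_2pl(0.0, a=1.2, b=-1.0))  # ~0.77
    print(icc_2pl(0.0, a=1.2, b=1.0))   # ~0.23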

A Survey on Distractors in Multiple-choice Questions and its Relationship on Difficulty and Discriminative Indices

Introduction: Distractors play a very important role in Multiple-Choice Questions (MCQs) and can influence the quality of tests. This study aimed to investigate the frequency of functioning and non-functioning distractors, and the relationship between distractor options and the difficulty and discrimination indices, at Ahvaz Jundishapur University of Medical Sciences in 2017. Methods: In this desc...
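
The indices this study refers to have standard classical-test-theory definitions: difficulty index = proportion correct, discrimination = difference in proportion correct between the top and bottom 27% of examinees, and a distractor chosen by fewer than 5% of examinees counts as non-functioning. The sketch below implements those conventions for illustration; it is not the study's own analysis code.

    from collections import Counter

    def item_analysis(responses, key, options=("A", "B", "C", "D")):
        # `responses` lists the option each examinee chose, sorted by
        # total test score (highest first).
        n = len(responses)
        k = max(1, round(0.27 * n))
        counts = Counter(responses)
        difficulty = counts[key] / n
        discrimination = (sum(r == key for r in responses[:k])
                          - sum(r == key for r in responses[-k:])) / k
        nonfunctioning = [o for o in options
                          if o != key and counts.get(o, 0) / n < 0.05]
        return difficulty, discrimination, nonfunctioning

    # Example: 10 examinees, key "A"; option "D" is never chosen, so it
    # is flagged as a non-functioning distractor.
    print(item_analysis(list("AAAABACBBC"), "A"))  # (0.5, 1.0, ['D'])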

Automated generation of assessment tests from domain ontologies

We investigate the effectiveness of OWL-DL ontologies in generating multiple choice questions (MCQs) that can be employed for conducting large-scale assessments. The details of a prototype system called the Automatic Test Generation (ATG) system and its extended version, the Extended-ATG system, are elaborated in this paper. The ATG system was useful in generating multiple choice question-sets of ...

Reliability of a 25-item low-stakes multiple-choice assessment of bronchoscopic knowledge.

BACKGROUND A need for improved patient safety, quality of care, and accountability has prompted the development of competency-based educational processes. Assessment tools related to bronchoscopy training, however, have not yet been developed or validated. PURPOSES To determine whether 25 multiple-choice questions (MCQs) extracted from the free, Web-based Essential Bronchoscopist (EB) learnin...
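
For a dichotomously scored 25-item MCQ test like the one described, internal-consistency reliability is conventionally estimated with the Kuder-Richardson 20 (KR-20) statistic. The function below implements that standard formula as an illustration, not the study's own analysis.

    def kr20(scores):
        # KR-20 reliability for dichotomously scored items. `scores[i][j]`
        # is 1 if examinee i answered item j correctly, else 0. Formula:
        # KR-20 = k/(k-1) * (1 - sum(p_j * q_j) / var(total scores)).
        n, k = len(scores), len(scores[0])
        totals = [sum(row) for row in scores]
        mean = sum(totals) / n
        variance = sum((t - mean) ** 2 for t in totals) / n
        pq = 0.0
        for j in range(k):
            p = sum(row[j] for row in scores) / n
            pq += p * (1 - p)
        return (k / (k - 1)) * (1 - pq / variance)

    # Toy example with 4 examinees and 3 items (real use: 25 items).
    print(round(kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]), 2))  # 0.75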

Journal:
  • CoRR

Volume: abs/1607.00869

Publication year: 2016